(Nieman Lab) Disinformation spread online is so disorienting that it’s messing with the researchers who study it

This week I got to hear Kate Starbird, assistant professor at the University of Washington and director of its Emerging Capacities of Mass Participation (emCOMP) Laboratory, speak about her research into how online disinformation spreads during crisis events (like shootings and terrorist attacks) and what she’s learned about the networks spreading this information and the tactics that they use.

A few of the intriguing bits from Starbird’s talk:

— She and her team have looked a lot at the language that conspiracy theorists use both in tweets and on sites like 21stCenturyWire.com. This is “question-mark language,” Starbird said. “‘I’m not gonna tell you what to think, I’m just gonna put the evidence out there and you can make up your mind yourself’ — this way of talking persists across different events” from Sandy Hook to the Boston Marathon bombing to the Orlando shooting.

— Starbird spent a lot of time reading the sites that were spreading these conspiracy theory posts — the sites behind the links being tweeted out. (“I do not recommend this.”) Stuff she looked at: homepages, about pages, ownership, authors, common themes and stories. She developed coding schemes for theme, political view, and so on. Common themes: “aliens, anti-big pharma, chemtrails, anti-corporate media, geo-engineering, George Soros, anti-globalist, anti-GMO, Flat Earth, Illuminati, Koch Brothers, anti-media, 9-11 truth, New World Order Cabal, nutritional supplements, pedophile rings, Rothschilds, anti-vaccine, anti-Zionist.” (On the subject of GMOs, by the way, please read this tweet thread, which is not about conspiracy theories but is really interesting to keep in mind as you read about Starbird’s work.)

Read it all.

